    Conceptual Frameworks for Multimodal Social Signal Processing

    This special issue is about a research area which is developing rapidly. Pentland gave it a name which has become widely used, ‘Social Signal Processing’ (SSP for short), and his phrase provides the title of a European project, SSPnet, which has a brief to consolidate the area. The challenge that Pentland highlighted was understanding the nonlinguistic signals that serve as the basis for “subconscious discussions between humans about relationships, resources, risks, and rewards”. He identified it as an area where computational research had made interesting progress, and could usefully make more.

    UK stroke incidence, mortality and cardiovascular risk management 1999–2008: time-trend analysis from the General Practice Research Database

    Objectives: Stroke is a major cause of morbidity and mortality. This study aimed to investigate secular trends in stroke across the UK. Design: A time-trend analysis from 1999 to 2008 within the UK General Practice Research Database. Outcome measures: Incidence and prevalence of stroke, stroke mortality, rate of secondary cardiovascular events, and prescribing of pharmacological therapy for primary and secondary prevention of cardiovascular disease. Results: The study cohort included 32 151 patients with a first stroke. Stroke incidence fell by 30%, from 1.48/1000 person-years in 1999 to 1.04/1000 person-years in 2008 (p<0.001). Stroke prevalence increased by 12.5%, from 6.40/1000 in 1999 to 7.20/1000 in 2008 (p<0.001). 56-day mortality after first stroke fell from 21% in 1999 to 12% in 2008 (p<0.0001). Prescribing of drugs to control cardiovascular risk factors increased consistently over the study period, particularly lipid-lowering and antihypertensive agents. In patients with atrial fibrillation, use of anticoagulants prior to first stroke did not increase with increasing stroke risk. Conclusion: Stroke incidence in the UK has decreased and survival after stroke has improved over the past 10 years. Improved drug treatment in primary care is likely to be a major contributor, with better control of risk factors both before and after incident stroke. There is, however, scope for further improvement in risk factor reduction in high-risk patients with atrial fibrillation.
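    The rate arithmetic behind these figures is standard person-time incidence. The sketch below reproduces the reported decline; it is a minimal illustration in which the case counts and person-years denominators are hypothetical placeholders, chosen only so the rates match the published values.

    ```python
    # Sketch of person-time incidence, as in "1.48/1000 person-years".
    # Case counts and denominators are hypothetical placeholders chosen
    # to reproduce the published rates, not the study's actual data.

    def incidence_per_1000(cases: int, person_years: float) -> float:
        """Incidence rate = new cases / person-years at risk, per 1000."""
        return 1000.0 * cases / person_years

    rate_1999 = incidence_per_1000(cases=4440, person_years=3_000_000)
    rate_2008 = incidence_per_1000(cases=3120, person_years=3_000_000)
    relative_fall = (rate_1999 - rate_2008) / rate_1999

    print(f"{rate_1999:.2f} -> {rate_2008:.2f} per 1000 person-years "
          f"({relative_fall:.0%} fall)")  # 1.48 -> 1.04, a ~30% fall
    ```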

    Digital Health: Implications for Heart Failure Management

    Digital health encompasses the use of information and communications technology and of advanced computing sciences in healthcare. This review covers the application of digital health in heart failure patients, focusing on teleconsultation, remote monitoring, and apps and wearables, looking at how these technologies can be used to support care and improve outcomes. Interest in and use of these technologies, particularly teleconsultation, have been accelerated by the coronavirus disease 2019 pandemic. Remote monitoring of heart failure patients, to identify those at high risk of hospitalisation and to support clinical stability, has been studied with mixed results. Remote monitoring of pulmonary artery pressure has a consistent effect on reducing hospitalisation rates for patients with moderately severe symptoms, and multiparameter monitoring shows promise for the future. Wearable devices and apps are increasingly used by patients for health and lifestyle support. Some wearable technologies have shown promise in AF detection, and others may be useful in supporting self-care and guiding prognosis, but more evidence is required to guide their optimal use. Support for patients and clinicians wishing to use these technologies is important, along with consideration of data validity and privacy and appropriate recording of decision-making.

    Framework-Based Qualitative Analysis of Free Responses of Large Language Models: Algorithmic Fidelity

    Today, using large-scale generative language models (LLMs), it is possible to simulate free responses to interview questions like those traditionally analyzed using qualitative research methods. Qualitative methodology encompasses a broad family of techniques involving manual analysis of open-ended interviews or conversations conducted freely in natural language. Here we consider whether artificial "silicon participants" generated by LLMs may be productively studied using qualitative methods aiming to produce insights that could generalize to real human populations. The key concept in our analysis is algorithmic fidelity, a term introduced by Argyle et al. (2023) capturing the degree to which LLM-generated outputs mirror human sub-populations' beliefs and attitudes. By definition, high algorithmic fidelity suggests that latent beliefs elicited from LLMs may generalize to real humans, whereas low algorithmic fidelity renders such research invalid. Here we used an LLM to generate interviews with silicon participants matching specific demographic characteristics one-for-one with a set of human participants. Using framework-based qualitative analysis, we showed that the key themes obtained from both human and silicon participants were strikingly similar. However, when we analyzed the structure and tone of the interviews, we found even more striking differences. We also found evidence of the hyper-accuracy distortion described by Aher et al. (2023). We conclude that the LLM we tested (GPT-3.5) does not have sufficient algorithmic fidelity to expect research on it to generalize to human populations. However, the rapid pace of LLM research makes it plausible this could change in the future. Thus we stress the need to establish epistemic norms now around how to assess the validity of LLM-based qualitative research, especially concerning the need to ensure representation of heterogeneous lived experiences. Comment: 46 pages, 5 tables, 5 figures
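    To make the interview setup concrete, here is a minimal sketch of eliciting one free response from a demographically conditioned "silicon participant". It assumes the openai Python client (v1+) with an API key in the environment; the persona fields, prompt wording, and interview question are illustrative placeholders rather than the study's actual protocol.

    ```python
    # Minimal sketch: elicit a free-text interview response from a "silicon
    # participant". Assumes the openai>=1.0 client; persona fields, prompt
    # wording, and the question are illustrative, not the paper's protocol.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def interview_silicon_participant(persona: dict, question: str) -> str:
        """Condition the model on a demographic persona, then ask one
        open-ended question and return the free-text answer."""
        system_prompt = (
            "You are a study participant answering an interviewer in your "
            f"own words. You are a {persona['age']}-year-old "
            f"{persona['gender']} from {persona['location']}, "
            f"working as a {persona['occupation']}."
        )
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # the model evaluated in the paper
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": question},
            ],
            temperature=1.0,  # free responses, not deterministic answers
        )
        return response.choices[0].message.content

    answer = interview_silicon_participant(
        {"age": 42, "gender": "woman", "location": "Manchester",
         "occupation": "teacher"},
        "How do you feel about sharing your health data for research?",
    )
    print(answer)
    ```

    In the study's design, each silicon persona was matched one-for-one to a human participant, and transcripts from both groups were coded with the same qualitative framework before comparison.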

    Cost-effectiveness of dabigatran etexilate for the prevention of stroke and systemic embolism in UK patients with atrial fibrillation

    Objective To assess the cost-effectiveness of dabigatran etexilate, a new oral anticoagulant, versus warfarin and other alternatives for the prevention of stroke and systemic embolism in UK patients with atrial fibrillation (AF). Methods A Markov model estimated the cost-effectiveness of dabigatran etexilate versus warfarin, aspirin or no therapy. Two patient cohorts with AF (starting age of <80 and ≥80 years) were considered separately, in line with the UK labelled indication. Modelled outcomes over a lifetime horizon included clinical events, quality-adjusted life years (QALYs), total costs and incremental cost-effectiveness ratios (ICERs). Results Patients treated with dabigatran etexilate experienced fewer ischaemic strokes (3.74 dabigatran etexilate vs 3.97 warfarin) and fewer combined intracranial haemorrhages and haemorrhagic strokes (0.43 dabigatran etexilate vs 0.99 warfarin) per 100 patient-years. Larger differences were observed comparing dabigatran etexilate with aspirin or no therapy. For patients initiating treatment at ages <80 and ≥80 years, the ICERs for dabigatran etexilate were £4831 and £7090/QALY gained versus warfarin, with a probability of cost-effectiveness at £20 000/QALY gained of 98% and 63%, respectively. For the patient cohort starting treatment at ages <80 years, the ICER versus aspirin was £3457/QALY gained and dabigatran etexilate was dominant (ie, was less costly and more effective) compared with no therapy. These results were robust in sensitivity analyses. Conclusions This economic evaluation suggests that the use of dabigatran etexilate as a first-line treatment for the prevention of stroke and systemic embolism is likely to be cost-effective in eligible UK patients with AF.
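    The headline results reduce to incremental cost-effectiveness arithmetic: ICER = ΔCost/ΔQALY, judged against a willingness-to-pay threshold (£20 000/QALY here). The sketch below shows that calculation; the lifetime cost and QALY totals are hypothetical placeholders, not the published model's outputs.

    ```python
    # Sketch of the ICER arithmetic behind results such as "£4831/QALY
    # gained versus warfarin". All totals below are hypothetical placeholders.

    def icer(cost_new: float, cost_ref: float,
             qalys_new: float, qalys_ref: float) -> float:
        """ICER = (C_new - C_ref) / (E_new - E_ref), cost per QALY gained.
        If the new therapy is both cheaper and more effective, it
        'dominates' the reference and no ratio is needed."""
        return (cost_new - cost_ref) / (qalys_new - qalys_ref)

    # Hypothetical lifetime discounted totals per patient:
    cost_dabi, qalys_dabi = 14_500.0, 8.30
    cost_warf, qalys_warf = 13_200.0, 8.03

    ratio = icer(cost_dabi, cost_warf, qalys_dabi, qalys_warf)
    threshold = 20_000.0  # UK willingness-to-pay, £/QALY
    print(f"ICER £{ratio:,.0f}/QALY gained; "
          f"cost-effective at threshold: {ratio < threshold}")
    ```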

    Electronic health records to facilitate clinical research

    Electronic health records (EHRs) provide opportunities to enhance patient care, embed performance measures in clinical practice, and facilitate clinical research. Concerns have been raised about increasing recruitment challenges in trials, burdensome and obtrusive data collection, and uncertain generalizability of results. Leveraging EHRs to counterbalance these trends is an area of intense interest. Initial applications envision EHRs as the primary data source for observational studies, embedded pragmatic or post-marketing registry-based randomized studies, and comparative effectiveness studies. Advancing this approach to randomized clinical trials, EHRs may be used to assess study feasibility, facilitate patient recruitment, and streamline data collection at baseline and follow-up. Ensuring data security and privacy, linking diverse systems, and maintaining infrastructure for the repeated use of high-quality data are among the challenges of using EHRs in clinical research. Collaboration between academia, industry, regulatory bodies, policy makers, patients, and EHR vendors is critical for the greater use of EHRs in clinical research. This manuscript identifies the key steps required to advance the role of EHRs in cardiovascular clinical research.

    Patient factors associated with titration of medical therapy in patients with heart failure with reduced ejection fraction: data from the QUALIFY international registry

    Aims: Failure to prescribe key medicines at evidence-based doses is associated with increased mortality and hospitalization for patients with heart failure with reduced ejection fraction (HFrEF). We assessed titration patterns of guideline-recommended HFrEF medicines internationally and explored associations with patient characteristics in the global, prospective, observational, longitudinal QUALIFY registry. Methods and results: Data were collected from September 2013 through December 2014. A total of 7095 patients from 36 countries were enrolled (>18 years, previous HF hospitalization within 1–15 months, left ventricular ejection fraction (LVEF) ≤40%), with dosage data at baseline and up to 18 months available for 4368 patients. Among these 4368 patients (mean age 63 ± 17 years, 75% male), the proportions receiving ≥100% of target dose at baseline were 30.6% (ACEIs), 2.9% (ARBs), 13.9% (BBs), 53.8% (MRAs), and 26.2% (ivabradine). At final follow-up, ≥100% of target dose was achieved in more patients for ACEIs (34.8%), BBs (18.0%), and ivabradine (30.5%), but was unchanged for ARBs (3.2%) and MRAs (53.7%). Adjusting for baseline dosage, uptitration during follow-up was more likely with younger age, higher systolic blood pressure, and absence of chronic kidney disease or diabetes for ACEIs/ARBs, and with younger age, higher body mass index, higher heart rate, lower LVEF, and absence of coronary artery disease for BBs. For ivabradine, uptitration was more likely with higher resting heart rate. Conclusions: The international QUALIFY registry suggests that few patients with HFrEF achieve target doses of disease-modifying medication, especially older patients and those with co-morbidity. Quality improvement initiatives are urgently required.

    A New Strategy for Deep Wide-Field High Resolution Optical Imaging

    We propose a new strategy for obtaining enhanced-resolution (FWHM = 0.12 arcsec) deep optical images over a wide field of view. As is well known, this type of image quality can in principle be obtained simply by fast guiding on a small (D = 1.5 m) telescope at a good site, but only for target objects that lie within a limited angular distance of a suitably bright guide star. For high-altitude turbulence this 'isokinetic angle' is approximately 1 arcminute. With a 1 degree field, say, one would need to track and correct the motions of thousands of isokinetic patches, yet there are typically too few sufficiently bright guide stars to provide the necessary guiding information. Our proposed solution to these problems has two novel features. The first is to use orthogonal transfer charge-coupled device (OTCCD) technology to implement a wide-field 'rubber focal plane' detector composed of an array of cells that can be guided independently. The second is to combine the measured motions of a set of guide stars made with an array of telescopes to provide the extra information needed to fully determine the deflection field. We discuss the performance, feasibility, and design constraints of a system that would provide collecting area equivalent to a single 9 m telescope, a 1 degree square field, and 0.12 arcsec FWHM image quality. Comment: 46 pages, 22 figures, submitted to PASP; a version with higher-resolution images and other supplementary material can be found at http://www.ifa.hawaii.edu/~kaiser/wfhr
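    The scale of the guiding problem, and of the proposed remedy, follows from back-of-envelope arithmetic on the numbers quoted in the abstract; the sketch below is our own illustration of that arithmetic, not the authors' calculation.

    ```python
    # Back-of-envelope arithmetic for the guiding problem, using the values
    # quoted in the abstract; the calculation itself is only an illustration.

    field_deg = 1.0          # field of view on a side, degrees
    isokinetic_arcmin = 1.0  # isokinetic angle for high-altitude turbulence

    # Independent isokinetic patches that must be guided separately:
    patches = (field_deg * 60.0 / isokinetic_arcmin) ** 2
    print(f"~{patches:.0f} independent patches")  # ~3600, i.e. thousands

    # 1.5 m elements needed to match the collecting area of a 9 m aperture
    # (collecting area scales as D^2):
    n_telescopes = (9.0 / 1.5) ** 2
    print(f"~{n_telescopes:.0f} x 1.5 m telescopes = one 9 m equivalent")  # 36
    ```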